Learning input correlations through nonlinear temporally asymmetric Hebbian plasticity.
Authors
Abstract
Triggered by recent experimental results, temporally asymmetric Hebbian (TAH) plasticity is considered a candidate model for the biological implementation of competitive synaptic learning, a key concept for the experience-based development of cortical circuitry. However, because of the well-known positive-feedback instability of correlation-based plasticity, the stability of the resulting learning process has remained a central problem. Plagued by either a runaway of the synaptic efficacies or a greatly reduced sensitivity to input correlations, the learning performance of current models is limited. Here we introduce a novel generalized nonlinear TAH learning rule that allows a balance between stability and sensitivity of learning. Using this rule, we study the capacity of the system to learn patterns of correlations between afferent spike trains. Specifically, we address the question of under which conditions learning induces spontaneous symmetry breaking and leads to inhomogeneous synaptic distributions that capture the structure of the input correlations. To study the efficiency of learning temporal relationships between afferent spike trains through TAH plasticity, we introduce a novel sensitivity measure that quantifies the amount of information about the correlation structure in the input that a learning rule is capable of storing in the synaptic weights. We demonstrate that by adjusting the weight dependence of the synaptic changes in TAH plasticity, it is possible to enhance the synaptic representation of temporal input correlations while maintaining the system in a stable learning regime. Indeed, for a given distribution of inputs, the learning efficiency can be optimized.
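As a concrete illustration of the kind of rule described in the abstract, the sketch below implements a weight-dependent, temporally asymmetric pair update in Python, with an exponent mu interpolating between additive (mu = 0) and multiplicative (mu = 1) weight dependence. The exponential timing window, the function name tah_update, and all parameter values are illustrative assumptions, not the exact formulation of the paper.

    import numpy as np

    # Sketch of a nonlinear, weight-dependent TAH (STDP) pair update.
    # The power-law weight dependence interpolates between additive (mu = 0)
    # and multiplicative (mu = 1) updates; all constants are illustrative.
    def tah_update(w, dt, lam=0.005, alpha=1.05, mu=0.5, tau=20.0):
        """Return the updated efficacy for one pre/post spike pairing.

        w     : current synaptic efficacy, scaled to [0, 1]
        dt    : t_post - t_pre in ms (dt > 0: pre fires before post)
        lam   : learning rate
        alpha : relative strength of depression
        mu    : weight-dependence exponent
        tau   : time constant of the exponential learning window (ms)
        """
        if dt > 0:   # causal pairing: potentiation, saturating as w -> 1
            dw = lam * (1.0 - w) ** mu * np.exp(-dt / tau)
        else:        # acausal pairing: depression, saturating as w -> 0
            dw = -lam * alpha * w ** mu * np.exp(dt / tau)
        return float(np.clip(w + dw, 0.0, 1.0))

    # A causal pairing strengthens the synapse, an acausal one weakens it.
    print(tah_update(w=0.5, dt=5.0))    # slightly above 0.5
    print(tah_update(w=0.5, dt=-5.0))   # slightly below 0.5

In a sketch of this form, a small exponent keeps updates nearly weight-independent (strong competition but a risk of runaway efficacies), whereas an exponent near 1 suppresses updates at the bounds and stabilizes the weight distribution; tuning it is one way to picture the stability/sensitivity trade-off the abstract refers to.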
Similar articles
Learning temporal correlations in biologically-inspired aVLSI
Temporally-asymmetric Hebbian learning is a class of algorithms motivated by data from recent neurophysiology experiments. While traditional Hebbian learning rules use mean firing rates to drive learning, this new form of learning involves precise firing times. Hence, such algorithms can capture temporal spike correlations. We present circuits and methods to implement temporally-asymmetric Hebb...
Predictive Sequence Learning in Recurrent Neocortical Circuits
Neocortical circuits are dominated by massive excitatory feedback: more than eighty percent of the synapses made by excitatory cortical neurons are onto other excitatory cortical neurons. Why is there such massive recurrent excitation in the neocortex and what is its role in cortical computation? Recent neurophysiological experiments have shown that the plasticity of recurrent neocortical synap...
Symbol emergence by combining a reinforcement learning schema model with asymmetric synaptic plasticity
A novel integrative learning architecture, an RLSM combined with an STDP network, is described. This architecture models symbol emergence in an autonomous agent engaged in reinforcement learning tasks. The architecture consists of two constituent learning architectures: a reinforcement learning schema model (RLSM) and a spike timing-dependent plasticity (STDP) network. RLSM is an incremental modular reinf...
A Model Analysis of Temporally Asymmetric Hebbian Learning
Among the many models of learning in neural networks, Hebbian and anti-Hebbian learning may be the most familiar. Although there are many variants, the most typical paradigms are such that when pre- and post-synaptic activations (firing) occur at the same time, synaptic efficacy is increased (Hebbian) or decreased (anti-Hebbian). According to recent neurophysiological observations, however,...
Spike-Timing-Dependent Hebbian Plasticity as Temporal Difference Learning
A spike-timing-dependent Hebbian mechanism governs the plasticity of recurrent excitatory synapses in the neocortex: synapses that are activated a few milliseconds before a postsynaptic spike are potentiated, while those that are activated a few milliseconds after are depressed. We show that such a mechanism can implement a form of temporal difference learning for prediction of input sequences....
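The timing dependence described in this abstract can be written as a simple asymmetric learning window: potentiation for pre-before-post pairings and depression for post-before-pre pairings. The sketch below is a generic pair-based window in Python; the exponential shape, amplitudes, and time constants are illustrative assumptions, not the specific model of the cited paper.

    import numpy as np

    # Generic asymmetric STDP window: dt = t_post - t_pre in ms.
    # dt > 0 (pre before post) -> potentiation; dt <= 0 -> depression.
    def stdp_window(dt, a_plus=0.01, a_minus=0.012, tau_plus=17.0, tau_minus=34.0):
        return np.where(dt > 0,
                        a_plus * np.exp(-np.abs(dt) / tau_plus),
                        -a_minus * np.exp(-np.abs(dt) / tau_minus))

    dts = np.linspace(-80.0, 80.0, 9)
    print(np.round(stdp_window(dts), 4))  # negative lobe for dt < 0, positive for dt > 0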
Journal:
The Journal of Neuroscience: the official journal of the Society for Neuroscience
Volume 23, Issue 9
Pages: -
Publication date: 2003